With genAI models, size matters (and smaller may be better) – Computerworld
Brown described data scientists as unicorns right now — rare, and often commanding the pay of a mythical creature as well. “And rightly so,” he said.
Most organizations can only employ a handful of data scientists at best, whether due to a scarcity of qualified talent or the cost of employing them, “which creates bottlenecks when it comes to effectively training and tuning the model,” he said.
A move to hybrid?
CIOs, Brown noted, have long been moving away from monolithic technologies — starting with the shift from UNIX to Linux in the early 2000s. He believes AI is at a similar turning point and argues that a hybrid strategy, similar to that of hybrid cloud, is most advantageous for deploying AI models. While large, somewhat amorphous LLMs are in the spotlight today, he envisions a future IT environment that is 50% applications and 50% SLMs (small language models).
“Data lives everywhere, whether it’s on-premises, in the cloud or at the edge. Therefore, data by nature is hybrid, and because AI needs to run where your data lives, it must also be hybrid,” Brown said. “In fact, we often tell customers and partners: AI is the ultimate hybrid workload.
“Essentially, a CIO will have as many AI models as applications. This means that training needs to be faster, tuning needs to speed up and costs need to be kept down. The key to this challenge lies in open source,” he continued. “Just as it democratized computing, open source will do so for AI; it already is.”